An inexact Riemannian proximal gradient method

Authors

Abstract

This paper considers the problem of minimizing the summation of a differentiable function and a nonsmooth function on a Riemannian manifold. In recent years, the proximal gradient method and its variants have been generalized to the Riemannian setting for solving such problems. Different approaches to generalizing the proximal mapping lead to different versions of Riemannian proximal gradient methods. However, their convergence analyses all rely on solving the proximal mapping exactly, which is either too expensive or impracticable. In this paper, we study an inexact Riemannian proximal gradient method. It is proven that if the proximal mapping is solved sufficiently accurately, then global convergence and a local convergence rate based on the Riemannian Kurdyka–Łojasiewicz property can be guaranteed. Moreover, practical conditions on the accuracy of solving the proximal mapping are provided. As a byproduct, the proximal gradient method on the Stiefel manifold proposed in Chen et al. [SIAM J Optim 30(1):210–239, 2020] can be viewed as an inexact Riemannian proximal gradient method, provided the proximal mapping is solved to a certain accuracy. Finally, numerical experiments on sparse principal component analysis are conducted to test the proposed conditions.
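The iteration pattern the abstract refers to can be illustrated in the Euclidean special case, where the proximal subproblem for an ℓ1 term has a closed form (soft-thresholding). This is a minimal sketch, not the paper's Riemannian algorithm: all names are illustrative, and the small random perturbation stands in for a proximal mapping that is only solved approximately.

```python
import numpy as np

def soft_threshold(v, t):
    # Closed-form proximal mapping of t*||.||_1; on a manifold the
    # proximal subproblem generally has no closed form.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def inexact_proximal_gradient(grad_f, prox_inexact, x0, step, n_iters=500):
    """Generic inexact proximal gradient loop:
    x_{k+1} ~ prox_{step*g}(x_k - step * grad_f(x_k)),
    where prox_inexact returns an approximate subproblem solution."""
    x = x0
    for _ in range(n_iters):
        x = prox_inexact(x - step * grad_f(x), step)
    return x

# Toy problem: f(x) = 0.5*||A x - b||^2, g(x) = lam*||x||_1 (the LASSO),
# with the prox "solved inexactly" by adding a tiny perturbation.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ np.array([1.0, 0.0, -2.0, 0.0, 0.5]) + 0.01 * rng.standard_normal(20)
lam = 0.1
step = 1.0 / np.linalg.norm(A, 2) ** 2  # 1 / Lipschitz constant of grad f

grad_f = lambda x: A.T @ (A @ x - b)
prox = lambda v, t: soft_threshold(v, lam * t) + 1e-8 * rng.standard_normal(v.shape)

x = inexact_proximal_gradient(grad_f, prox, np.zeros(5), step)
```

With the perturbation small enough, the iterates still drive the composite objective down, which is the intuition behind the paper's accuracy conditions.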


Related articles

Inexact proximal stochastic gradient method for convex composite optimization

We study an inexact proximal stochastic gradient (IPSG) method for convex composite optimization, whose objective function is a summation of an average of a large number of smooth convex functions and a convex, but possibly nonsmooth, function. Variance reduction techniques are incorporated in the method to reduce the stochastic gradient variance. The main feature of this IPSG algorithm is to a...


An inexact proximal method for quasiconvex minimization

In this paper we propose an inexact proximal point method to solve constrained minimization problems with locally Lipschitz quasiconvex objective functions. Assuming that the function is also bounded from below and lower semicontinuous, and using proximal distances, we show that the sequence generated by the method converges to a stationary point of the problem.


An Inexact Accelerated Proximal Gradient Method for Large Scale Linearly Constrained Convex SDP

The accelerated proximal gradient (APG) method, first proposed by Nesterov for minimizing smooth convex functions, later extended by Beck and Teboulle to composite convex objective functions, and studied in a unifying manner by Tseng, has proven to be highly efficient in solving some classes of large scale structured convex optimization (possibly nonsmooth) problems, including nuclear norm mini...
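The APG scheme described in this abstract follows the extrapolation pattern of Beck and Teboulle's method: a gradient step at an extrapolated point, then a proximal step. Below is a minimal Euclidean sketch on a toy ℓ1-regularized least-squares problem; all names and the toy data are illustrative assumptions, not the SDP setting of the paper.

```python
import numpy as np

def soft_threshold(v, t):
    # Closed-form proximal mapping of t*||.||_1.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def apg(grad_f, prox, x0, step, n_iters=300):
    """Accelerated proximal gradient (FISTA-style) iteration:
    gradient step at an extrapolated point y, then the prox of g."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(n_iters):
        x_new = prox(y - step * grad_f(y), step)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

# Toy problem: min 0.5*||A x - b||^2 + lam*||x||_1.
rng = np.random.default_rng(1)
A = rng.standard_normal((30, 8))
b = A @ (np.arange(8) % 3 - 1.0) + 0.01 * rng.standard_normal(30)
lam = 0.05
step = 1.0 / np.linalg.norm(A, 2) ** 2
x_apg = apg(lambda z: A.T @ (A @ z - b),
            lambda v, t: soft_threshold(v, lam * t),
            np.zeros(8), step)
```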


An inexact and nonmonotone proximal method for smooth unconstrained minimization

An implementable proximal point algorithm is established for the smooth nonconvex unconstrained minimization problem. At each iteration, the algorithm minimizes approximately a general quadratic by a truncated strategy with step length control. The main contributions are: (i) a framework for updating the proximal parameter; (ii) inexact criteria for approximately solving the subproblems; (iii) ...


An inexact proximal point method for solving generalized fractional programs

In this paper, we present several new implementable methods for solving a generalized fractional program with convex data. They are Dinkelbach-type methods where a prox-regularization term is added to avoid the numerical difficulties arising when the solution of the problem is not unique. In these methods, at each iteration a regularized parametric problem is solved inexactly to obtain an appro...
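Without the prox-regularization term, the classical Dinkelbach iteration for a single-ratio problem min f(x)/g(x) with g > 0 alternates between updating the ratio parameter and solving a parametric subproblem. The sketch below shows that basic pattern; the exactly solvable toy subproblem and all names are simplifying assumptions, whereas the paper regularizes the parametric problem and solves it inexactly.

```python
def dinkelbach(f, g, solve_parametric, x0, tol=1e-10, max_iter=100):
    """Dinkelbach iteration: set lam = f(x)/g(x), then solve the
    parametric problem min_x f(x) - lam*g(x); stop once its optimal
    value is (numerically) zero, which certifies optimality."""
    x = x0
    for _ in range(max_iter):
        lam = f(x) / g(x)
        x = solve_parametric(lam)
        if abs(f(x) - lam * g(x)) < tol:
            break
    return x

# Toy example: minimize (x^2 + 1)/x over x > 0 (minimum at x = 1, value 2).
f = lambda x: x * x + 1.0
g = lambda x: x
# Minimizer of x^2 + 1 - lam*x over x > 0 is x = lam/2 (for lam > 0).
solve_parametric = lambda lam: lam / 2.0
x_star = dinkelbach(f, g, solve_parametric, x0=3.0)
```

When the parametric subproblem has several solutions, the iteration can behave erratically, which is the numerical difficulty the prox-regularization in the paper is designed to avoid.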



Journal

Journal title: Computational Optimization and Applications

Year: 2023

ISSN: 0926-6003, 1573-2894

DOI: https://doi.org/10.1007/s10589-023-00451-w